
Normalizing Kalman Filters for Multivariate Time Series Analysis

Neural Information Processing Systems

This paper tackles the modelling of large, complex and multivariate time series panels in a probabilistic setting. To this end, we present a novel approach reconciling classical state space models with deep learning methods. By augmenting state space models with normalizing flows, we mitigate imprecisions stemming from idealized assumptions in state space models. The resulting model is highly flexible while still retaining many of the attractive properties of state space models: uncertainty and observation errors are properly accounted for, inference is tractable, sampling is efficient, and good generalization performance is observed even in low-data regimes. We demonstrate competitiveness against state-of-the-art deep learning methods on the tasks of forecasting real-world data and handling varying levels of missing data.
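As a rough illustration of the idea (a toy sketch, not the paper's actual model), the snippet below runs a scalar Kalman filter on observations that are first mapped through the inverse of a simple invertible transform (here `exp`/`log` standing in for a learned normalizing flow), adding the log-Jacobian of the inverse map to the likelihood. All function names, parameter values, and data are invented for illustration.

```python
import math

# Toy setup: latent linear-Gaussian state z_t; observations are
# y_t = exp(x_t), where x_t = z_t + Gaussian noise. The "flow" f = exp
# is inverted before the Kalman update, and the log-likelihood gains
# the log|d f^{-1}/dy| correction term.

def kalman_step(mean, var, y, a=0.9, q=0.1, r=0.05):
    """One predict/update step on the flow-inverted observation."""
    x = math.log(y)                                  # inverse flow f^{-1}(y)
    pred_mean, pred_var = a * mean, a * a * var + q  # predict
    s = pred_var + r                                 # innovation variance
    k = pred_var / s                                 # Kalman gain
    new_mean = pred_mean + k * (x - pred_mean)       # update
    new_var = (1.0 - k) * pred_var
    # Gaussian log-likelihood of x, plus the log-Jacobian of the
    # inverse flow: d/dy log(y) = 1/y, so the term is -log(y).
    loglik = (-0.5 * (math.log(2 * math.pi * s) + (x - pred_mean) ** 2 / s)
              - math.log(y))
    return new_mean, new_var, loglik

mean, var, total = 0.0, 1.0, 0.0
for y in [1.2, 1.5, 1.1, 0.9]:
    mean, var, ll = kalman_step(mean, var, y)
    total += ll
```

Because the transform is invertible, filtering stays exactly tractable in the latent space; the flow only reshapes the observation marginals.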


Review for NeurIPS paper: Normalizing Kalman Filters for Multivariate Time Series Analysis

Neural Information Processing Systems

Additional Feedback: I very much enjoyed the paper, and I congratulate the authors on their work. The experimental results by themselves do not necessarily provide a compelling reason to use NKF over previous models. But I think the idea is an important and straightforward one, addressing a variety of weaknesses of previous models. On the other hand, the omission of any discussion of the non-additive noise seemed problematic to me. Minor comments: [L262]: [5] is a textbook, please be more specific in the ref.


Review for NeurIPS paper: Normalizing Kalman Filters for Multivariate Time Series Analysis

Neural Information Processing Systems

We thank you for your submission. Reviewers agree that the paper is novel and of interest to the NeurIPS community. Please carefully address the reviewers' comments in your camera-ready version.



Improving age prediction: Utilizing LSTM-based dynamic forecasting for data augmentation in multivariate time series analysis

Gao, Yutong, Ellis, Charles A., Calhoun, Vince D., Miller, Robyn L.

arXiv.org Artificial Intelligence

deep learning models. However, the neuroimaging field is notably hampered by the scarcity of such datasets. In this work, we proposed a data augmentation and validation framework that utilizes dynamic forecasting with Long Short-Term Memory (LSTM) networks to enrich datasets. We extended multivariate time series data by predicting the time courses of independent component networks (ICNs) in both one-step and recursive configurations. While such transformations are cost-effective, they may be constrained by the quality of the training set and may not preserve the temporal dynamics inherent in time-series data. Data augmentation can also be facilitated through deep learning models, such as Generative Adversarial Networks (GANs), which are capable of generating synthetic fMRI data [6]. Additionally, training Recurrent Neural Networks (RNNs) to dynamically predict future states serves as another method for data augmentation, as demonstrated by
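To make the one-step vs. recursive distinction concrete, here is a hypothetical pure-Python sketch in which a per-channel AR(1) fit stands in for the trained LSTM; `fit_ar1`, `recursive_forecast`, and the toy data are all invented for illustration, not taken from the paper.

```python
def fit_ar1(series):
    """Least-squares AR(1) coefficient for one channel."""
    num = sum(a * b for a, b in zip(series[1:], series[:-1]))
    den = sum(v * v for v in series[:-1])
    return num / den

def one_step_forecast(model, data):
    """One-step mode: every prediction is conditioned on an observed frame."""
    return [[c * v for c, v in zip(model, frame)] for frame in data]

def recursive_forecast(model, last, steps):
    """Recursive mode: each prediction is fed back in as the next input."""
    out, x = [], last
    for _ in range(steps):
        x = [c * v for c, v in zip(model, x)]
        out.append(x)
    return out

# Two-channel toy multivariate series (time-major: one row per time point).
data = [[1.0, 2.0], [0.9, 1.8], [0.82, 1.63], [0.74, 1.46]]
model = [fit_ar1(list(ch)) for ch in zip(*data)]
synthetic = recursive_forecast(model, data[-1], steps=3)
augmented = data + synthetic  # enlarged panel used for augmentation
```

The recursive rollout is what produces genuinely new frames for augmentation, at the cost of compounding prediction error; the one-step mode is mainly useful for validating the forecaster against held-out observations.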